Introduction to Intelligent Optimization: WU-SCI1005
WU-SCI1005 Lecture 1

Optimization is the mathematical process of finding the "best" solution (minimizing or maximizing an objective function) within a feasible region defined by constraints.

Classical vs. Intelligent Methods

  • Newton-Raphson Method: An iterative root-finding approach; applied to the stationarity condition ∇f = 0, it becomes Newton's method for optimization, which uses second-order derivatives (the Hessian).
  • Gradient Descent: A first-order method that moves toward the local minimum by following the negative gradient.
  • Evolutionary Algorithms (EAs): Stochastic, population-based search methods inspired by biological natural selection.
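As a concrete instance of the first method above, a minimal Newton-Raphson root-finder (function and variable names are illustrative, not from the lecture):

```python
def newton_raphson(f, fprime, x0, tol=1e-10, max_iter=100):
    """Find a root of f by iterating x <- x - f(x)/f'(x)."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / fprime(x)
        x -= step
        if abs(step) < tol:  # stop once the update is negligible
            break
    return x

# Example: solve x^2 - 2 = 0, i.e. approximate sqrt(2)
root = newton_raphson(lambda x: x * x - 2, lambda x: 2 * x, x0=1.0)
```

Note the quadratic convergence: each iteration roughly doubles the number of correct digits, which is why so few iterations are needed.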

Critical Concepts

It is crucial to distinguish between the Decision Vector (the variables we change) and the Objective Function (the metric for success).

Encoding Pitfalls
Beware of the Vanishing Gradient in calculus-based methods and Hamming Cliffs in binary-coded EAs. A single decimal increment (e.g., 7 to 8) can require flipping all bits (0111 to 1000), creating a "cliff" that hinders search efficiency. Use Gray Coding to mitigate this.
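The Hamming cliff at 7 → 8 can be checked directly. A short sketch using the standard binary-reflected Gray code (helper names are my own):

```python
def to_gray(n):
    """Binary-reflected Gray code: adjacent integers differ in one bit."""
    return n ^ (n >> 1)

def hamming(a, b):
    """Number of bit positions in which a and b differ."""
    return bin(a ^ b).count("1")

# Plain binary: 7 (0111) -> 8 (1000) flips all four bits.
binary_dist = hamming(7, 8)

# Gray code: gray(7) = 0100, gray(8) = 1100 -- a single bit flips.
gray_dist = hamming(to_gray(7), to_gray(8))
```

The four-bit cliff in plain binary collapses to a one-bit step under Gray coding, which is exactly why it is the recommended mitigation.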
Python Implementation: Gradient Descent
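A minimal sketch of gradient descent on a one-dimensional quadratic (function names and hyperparameters are illustrative, not the lecture's own code):

```python
def gradient_descent(grad, x0, lr=0.1, n_iters=100):
    """Minimize a function by repeatedly stepping along the negative gradient."""
    x = x0
    for _ in range(n_iters):
        x = x - lr * grad(x)  # move opposite to the gradient
    return x

# Minimize f(x) = (x - 3)^2, whose gradient is 2(x - 3); the minimum is at x = 3.
x_min = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
```

The learning rate `lr` matters: here each step shrinks the error by a factor of 0.8, while too large a rate would cause divergence.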
Question 1

Why is a Convex Optimization problem considered "easier" than a non-convex one?

  • Any local optimum is guaranteed to be the global optimum.
  • It does not require the calculation of derivatives.
  • It utilizes population-based stochastic search.
  • It requires significantly less memory to compute.
Question 2

In the context of Evolutionary Algorithms, what does the "Phenotype" represent?

  • The binary or Gray encoding of the variables.
  • The actual expressed features or performance of a solution.
Case Study: Maximizing Triangle Area
Read the scenario below and answer the formulation questions.
Consider the problem of maximizing the area of a right-angled triangle where the length of the hypotenuse $c$ is fixed.
Q1. Identify the decision variables and the objective function.
Answer:
Variables: The lengths of the two legs, $a$ and $b$.
Objective Function: Maximize $Area = \frac{1}{2} \cdot a \cdot b$.
Q2. State the constraint based on the geometric properties.
Answer:
Based on the Pythagorean theorem, the constraint is: $a^2 + b^2 = c^2$.
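This formulation can be checked numerically. A brute-force sketch that enforces the constraint by substituting $b = \sqrt{c^2 - a^2}$ (assuming $c = 1$ purely for illustration):

```python
import math

c = 1.0
best_a, best_area = 0.0, 0.0

# Sweep a over (0, c); the constraint a^2 + b^2 = c^2 fixes b.
for i in range(1, 10000):
    a = c * i / 10000
    b = math.sqrt(c * c - a * a)
    area = 0.5 * a * b
    if area > best_area:
        best_a, best_area = a, area

# The maximum occurs at a = b = c / sqrt(2), giving area = c^2 / 4,
# i.e. the optimal right triangle is isosceles.
```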
Q3. If using the Newton-Raphson method, what matrix must be calculated to account for second-order partial derivatives?
Answer:
The Hessian Matrix ($H$), which contains all second-order partial derivatives of the objective function.
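For the area objective $f(a, b) = \frac{1}{2}ab$, the Hessian is $\begin{pmatrix} 0 & \frac{1}{2} \\ \frac{1}{2} & 0 \end{pmatrix}$, which a central finite-difference sketch can confirm (helper name is illustrative):

```python
def hessian_fd(f, x, h=1e-4):
    """Approximate the Hessian of f at point x via central differences."""
    n = len(x)
    H = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            xpp = list(x); xpp[i] += h; xpp[j] += h
            xpm = list(x); xpm[i] += h; xpm[j] -= h
            xmp = list(x); xmp[i] -= h; xmp[j] += h
            xmm = list(x); xmm[i] -= h; xmm[j] -= h
            H[i][j] = (f(xpp) - f(xpm) - f(xmp) + f(xmm)) / (4 * h * h)
    return H

# Second-order partials of (1/2)ab: d2f/da2 = 0, d2f/dadb = 1/2.
area = lambda v: 0.5 * v[0] * v[1]
H = hessian_fd(area, [1.0, 1.0])
```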